
    Continuous Time Adaptive Filtering

    This paper deals with the problem of adaptive estimation in the continuous-time Kalman filtering scheme. Necessary and sufficient conditions for the convergence of the parameter estimators are discussed. For systems characterized by constant but unknown parameters, the convergence conditions can be checked before observation begins. The method of proof is based on the relation between the singularity of certain probability measures and the convergence of the Bayesian estimation algorithm.
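
    As a rough illustration of Bayesian adaptive estimation over a finite set of parameter hypotheses (a discrete-time analogue only; the paper's construction is in continuous time, and the scalar model and values below are assumptions for the sketch):

```python
# Illustrative sketch: a bank of Kalman filters, one per candidate value of an
# unknown parameter, with Bayesian posterior weights updated from innovation
# likelihoods. Discrete-time analogue; not the paper's continuous-time scheme.
import numpy as np

rng = np.random.default_rng(0)

a_true, q, r = 0.8, 0.1, 0.2             # true state coefficient, noise variances
candidates = np.array([0.5, 0.8, 0.95])  # hypothesised parameter values
weights = np.full(len(candidates), 1.0 / len(candidates))  # uniform prior

m = np.zeros(len(candidates))   # per-candidate state estimates
P = np.ones(len(candidates))    # per-candidate estimate variances

x = 0.0
for _ in range(500):
    x = a_true * x + rng.normal(scale=np.sqrt(q))   # simulate the true state
    y = x + rng.normal(scale=np.sqrt(r))            # noisy observation

    for i, a in enumerate(candidates):
        # Kalman time update and measurement update for candidate a
        m_pred = a * m[i]
        P_pred = a * a * P[i] + q
        S = P_pred + r                  # innovation variance
        K = P_pred / S                  # Kalman gain
        innov = y - m_pred
        m[i] = m_pred + K * innov
        P[i] = (1.0 - K) * P_pred
        # Bayesian weight update: Gaussian likelihood of the innovation
        weights[i] *= np.exp(-0.5 * innov**2 / S) / np.sqrt(2 * np.pi * S)
    weights /= weights.sum()            # normalise posterior probabilities

print(dict(zip(candidates, np.round(weights, 3))))  # mass concentrates on 0.8
```

    Convergence of the posterior weights to the true hypothesis is exactly the kind of property that, for identifiable constant parameters, can be verified before any data arrive.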

    Modeling Medical Manpower Allocation in Centralized Systems

    demands careful quantitative and qualitative analysis of how the subsystems function. One of the main subsystems in the general health care model studied here is the manpower system. Creating the manpower model requires various statistical data. Before these data can be used in modeling, it is necessary to investigate some of the socio-economic mechanisms influencing the spatial distribution of manpower, and to describe mathematically the processes that make the state of the system dynamic. This paper is devoted to the mathematical description of one such mechanism and includes a computer program based on the equations used; the investigation may be considered the first stage in the creation of the corresponding computer model. The development of centralized socio-economic systems is determined largely by well-grounded long-term plans for resource allocation, most importantly manpower allocation. The complexity of human demands makes it necessary to take into account in planning not only the actual manpower demands of the different regions but also the specific regional conditions influencing migration processes; neglecting these conditions may lead to large planning errors.
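
    The abstract gives no details of the equations used; purely for orientation, a minimal sketch in the spirit of classical manpower-planning models updates regional manpower stocks with a migration matrix plus centrally planned recruitment (the regions, rates, and figures below are invented):

```python
# Minimal sketch, not the paper's model: regional manpower stocks evolved by a
# migration/transition matrix with central recruitment each planning period.
import numpy as np

# Column-stochastic migration matrix: M[i, j] = fraction of region j's
# manpower moving to region i per period (stayers on the diagonal).
M = np.array([
    [0.90, 0.05, 0.02],
    [0.07, 0.90, 0.03],
    [0.03, 0.05, 0.95],
])
recruitment = np.array([100.0, 50.0, 25.0])  # new specialists allocated per period

n = np.array([5000.0, 3000.0, 1000.0])       # initial regional manpower stocks
for year in range(10):
    n = M @ n + recruitment                  # one planning period
print(np.round(n))
```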

    Dynamics in Survival Analysis: Conditional Gaussian Property versus Cameron-Martin Formula

    This paper describes a stochastic process model for the mortality rates of a population. The key question is the relationship between the conditional and unconditional survival functions. The Cameron–Martin solution to the problem is compared with a solution based on the conditional Gaussian approach, and the advantages of the Gaussian approach are discussed. The proof of the main averaging formula uses the martingale specification of the random hazard rate.
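
    The basic averaging relation can be stated compactly (the notation here is mine, not the paper's): with a random hazard rate $\lambda(t)$, the unconditional survival function is the expectation of the conditional one, and the population hazard is the conditional expectation of $\lambda$ among survivors:

```latex
% Conditional vs. unconditional survival under a random hazard \lambda(t)
\bar S(t) \;=\; \Pr(T > t) \;=\; \mathbb{E}\!\left[\exp\!\Big(-\int_0^t \lambda(s)\,ds\Big)\right],
\qquad
\bar\lambda(t) \;=\; -\frac{d}{dt}\ln \bar S(t) \;=\; \mathbb{E}\big[\lambda(t)\,\big|\,T > t\big].
```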

    The Expected Number of Transitions from One State to Another: A Medico-Demographic Model

    Medico-demographic models are used to describe the dynamic properties of a population's health status. In these models the human population is represented as a number of interacting social groups of individuals whose dynamics comprise birth, aging, death, and the transition of individuals from one state to another. The probabilities of these transitions play a central role in the analysis of a population's health status. This paper concentrates on the expected number of transitions between states for selected groups of individuals, and on related variables, in both discrete- and continuous-time models under the Markovian assumption. Correlation properties of the variables generated by the transitions are also investigated. The derived formulas may help the health care decision maker estimate the expected frequency of hospitalization and the expected number of visits to physicians during a selected time interval; they also give a reasonable basis for calculating health care resource demands within the limits of the assumptions used. Forecasting transition probabilities helps in detecting possible future problems in a health care system.
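
    A small numerical sketch of the continuous-time case under the Markovian assumption (the states, rates, and horizon below are illustrative, not taken from the paper): with generator $Q$ and occupancy probabilities $p(t) = p(0)e^{Qt}$, the expected number of $i \to j$ transitions over $[0, T]$ is $\int_0^T p_i(t)\,q_{ij}\,dt$.

```python
# Expected number of transitions in a continuous-time Markov chain.
# States and rates are illustrative: healthy / sick / dead.
import numpy as np
from scipy.linalg import expm

Q = np.array([               # generator matrix: rows sum to zero
    [-0.30,  0.25, 0.05],    # healthy -> sick, healthy -> dead
    [ 0.40, -0.50, 0.10],    # sick -> healthy, sick -> dead
    [ 0.00,  0.00, 0.00],    # dead is absorbing
])
p0 = np.array([1.0, 0.0, 0.0])
T, steps = 10.0, 2000
ts = np.linspace(0.0, T, steps)

# Occupancy probabilities p(t) on a time grid
p = np.array([p0 @ expm(Q * t) for t in ts])

i, j = 0, 1  # healthy -> sick, i.e. expected hospitalizations in this toy model
expected_transitions = np.trapz(p[:, i] * Q[i, j], ts)
print(f"expected healthy->sick transitions over [0, {T}]: {expected_transitions:.2f}")
```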

    On the Notion of Random Intensity

    The problem of explaining observed trends in mortality, morbidity, and other kinds of individual transitions has generated numerous attempts to incorporate covariates into survival models. The first models used deterministic, constant factors as explanatory variables; gradually it became clear that the random and dynamic nature of the covariates should also be taken into account. As a result, the notion of random intensity became widely used in the analysis of the asymptotic properties of maximum likelihood and Cox regression estimators. Although it has a clear intuitive sense, the notion of random intensity can be introduced in different ways. The traditional way defines the intensity in terms of the probability distribution of the failure time. Another way appeals to martingale theory and defines the intensity in terms of a predictable process called the "compensator". For deterministic rates and simple cases of stochastic intensities there are already results establishing a one-to-one correspondence between the two definitions; the correspondence is reached through probabilistic representation results for the compensator. Martingale theory guarantees the existence of the predictable compensator in more general cases, but probabilistic representations analogous to the simple cases are still unknown. Such representations are crucial, for instance, in analyzing the relation between the duration of the life cycle of some unit and stochastically changing influential variables. This paper establishes such a representation for a particular case; the generalization to more general situations is straightforward. The development uses some basic notions of the "general theory of processes".
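
    In the simplest single-jump case the two definitions line up as follows (standard material, not specific to this paper): for a failure time $T$ with density $f$ and distribution $F$, the hazard defined from the probability distribution and the compensator of the counting process $N_t = \mathbf{1}(T \le t)$ are related by

```latex
\lambda(t) \;=\; \lim_{h \downarrow 0}\frac{1}{h}\,
\Pr\big(t \le T < t+h \,\big|\, T \ge t\big)
\;=\; \frac{f(t)}{1 - F(t)},
\qquad
A_t \;=\; \int_0^{t \wedge T} \lambda(s)\,ds,
```

    and $N_t - A_t$ is a martingale, so $A$ is the predictable compensator (dual predictable projection) of $N$.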

    Chances of Survival in a Chaotic Environment

    In this paper, Anatoli Yashin examines mathematically how individual differences in frailty ("susceptibility" to death), defined as a quadratic function of environmental factors, affect the mortality rates in a population. He goes on to show how our chances of survival depend on the extent of our knowledge about the processes affecting death.
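
    In this setting (notation mine, not the paper's) the hazard is a quadratic form of an environmental vector $Y_t$; if, conditionally on survival, $Y_t$ stays Gaussian with mean $m(t)$ and covariance $\gamma(t)$, the population mortality rate averages to a closed form via the standard identity for Gaussian quadratic forms:

```latex
\mu(t, Y_t) \;=\; Y_t^{\top} Q\, Y_t \quad (Q \ge 0),
\qquad
\bar\mu(t) \;=\; \mathbb{E}\big[\mu(t, Y_t)\,\big|\,T > t\big]
\;=\; m(t)^{\top} Q\, m(t) \;+\; \operatorname{tr}\!\big(Q\,\gamma(t)\big).
```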

    Some Aspects of Model Tuning Procedure: Information-Theoretic Analysis

    Computer and mathematical models are not exact representations of reality: lack of knowledge, technical restrictions, and particular modeling goals make it necessary to approximate the real system in various ways. Nevertheless, the procedures by which models are adjusted to observed data are often based on the assumption that the real system has the same structure as the model and differs only in the values of certain parameters. These true values should usually be included in the feasible set of parameter values, and this fact, together with some additional conditions, usually provides the convergence property for many estimation algorithms. In reality, however, all of these assumptions are generally false. Even if the structure of the system corresponds to the structure of the model, the real parameter values often do not belong to the presupposed feasible set. Moreover, mathematicians often deliberately shrink this set in order to simplify the estimation algorithms: for instance, they approximate a compact set of parameter values by a finite set of points, increasing the chances that the real parameter values will be excluded. It is therefore both remarkable and surprising that, despite these false assumptions and approximations, the parameter estimation algorithms often still converge! The model resulting from this tuning procedure will of course not coincide with the real system, and this raises a natural question: how far is the computer model from reality? To consider this question one needs some way of measuring the distance between models. One measure of divergence was introduced by Bhattacharyya; Kullback also formulated a measure of information distance. Neither of these, however, is a proper metric. Baram and Sandell later introduced a modified version of the Kullback measure, which has been shown to be a proper distance metric. They applied this approach to linear Gaussian systems and models; in this paper it is generalized to a wider class of systems...
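
    A small numerical illustration of why such measures need modification (generic, not the Baram–Sandell construction itself): the Kullback–Leibler divergence between two Gaussian densities is asymmetric, so it cannot be a metric; symmetrising it, as in Jeffreys' divergence, removes the asymmetry but still does not guarantee the triangle inequality.

```python
# Generic illustration, not the Baram-Sandell metric: KL divergence between
# two univariate Gaussians is asymmetric, hence not a proper distance metric.
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    """Closed form for KL( N(mu0, s0^2) || N(mu1, s1^2) )."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

d01 = kl_gauss(0.0, 1.0, 1.0, 2.0)
d10 = kl_gauss(1.0, 2.0, 0.0, 1.0)
print(f"KL(P||Q) = {d01:.3f}, KL(Q||P) = {d10:.3f}")  # clearly asymmetric
print(f"Jeffreys (symmetrised) = {d01 + d10:.3f}")
```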

    Hazard Rates and Probability Distributions: Representation of Random Intensities

    Recent attempts to apply the results of martingale theory have shown that it is first necessary to interpret this abstract mathematical theory in more conventional probabilistic terms. One example is the need to represent the dual predictable projections (compensators) used in martingale theory in terms of probability distributions. Up to now, however, a representation of this type has been derived only for one special case. In this paper the author gives probabilistic representations of the dual predictable projections of the integer-valued random measures corresponding to the jumps of a semimartingale, with respect to the sigma-algebras generated by the process itself. The results are of practical importance because such dual predictable projections are usually interpreted as random intensities or hazard rates related to jumps in trajectories; applications are found in fields such as mathematical demography and risk analysis.
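
    For orientation, one standard special case of such a representation can be written down directly (textbook material, notation mine): for a single failure time $T$ with distribution $F$ and its natural filtration, the dual predictable projection of $N_t = \mathbf{1}(T \le t)$ is

```latex
A_t \;=\; \int_{(0,\; t \wedge T]} \frac{dF(s)}{1 - F(s^-)},
```

    which reduces to $\int_0^{t \wedge T} \lambda(s)\,ds$ when $F$ has a density; the paper extends representations of this kind to the integer-valued random measures generated by the jumps of a semimartingale.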

    Evaluation of Danger or How Knowledge Transforms Hazard Rates

    In this paper, Anatoli Yashin examines how changes in information about the risks associated with possible future events formally transform the chances of these events occurring. He describes an analytical tool for the probabilistic analysis of hazard rates under various assumptions about the available information.
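
    A standard result of this kind (not necessarily the paper's formulation) is the innovation theorem: if the event history itself remains observable, coarsening the observer's filtration from $\mathcal{F}_t$ to $\mathcal{G}_t \subset \mathcal{F}_t$ replaces the hazard rate by its conditional expectation given the smaller information set,

```latex
\tilde\lambda(t) \;=\; \mathbb{E}\big[\lambda(t)\,\big|\,\mathcal{G}_t\big],
```

    so that more, or less, knowledge about the underlying risks literally transforms the hazard rate an observer assigns.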